Learning and exploiting problem structure is one of the key challenges in optimization. This is especially important for black-box optimization (BBO), where prior structural knowledge of a problem is not available. Existing model-based Evolutionary Algorithms (EAs) are very efficient at learning structure in both the discrete and the continuous domain. In this paper, discrete and continuous model-building mechanisms are integrated for the Mixed-Integer (MI) domain, comprising discrete and continuous variables.

We revisit a recently introduced model-based evolutionary algorithm for the MI domain, the Genetic Algorithm for Model-Based mixed-Integer opTimization (GAMBIT). We extend GAMBIT with a parameterless scheme that allows for practical use of the algorithm without the need to explicitly specify any parameters. We furthermore contrast GAMBIT with other model-based alternatives. We also address the ultimate goal of GAMBIT, processing mixed dependences explicitly, by introducing a new mechanism for their explicit exploitation. We find that processing mixed dependences with this novel mechanism allows for more efficient optimization.

We further contrast the parameterless GAMBIT with Mixed-Integer Evolution Strategies (MIES) and other state-of-the-art MI optimization algorithms from the General Algebraic Modeling System (GAMS) commercial algorithm suite on problems with and without constraints, and show that GAMBIT is capable of solving problems where variable dependences prevent many algorithms from successfully optimizing them.